
    Two-step Bayesian Structure Health Monitoring Approach for IASC-ASCE Phase II Simulated and Experimental Benchmark Studies

    This report uses a two-step probabilistic structural health monitoring approach to analyze the Phase II simulated and experimental benchmark studies sponsored by the IASC-ASCE Task Group on Structural Health Monitoring. The studies involve damage detection and assessment of the test structure using simulated ambient-vibration data and experimental data generated by various excitations. The two-step approach involves modal identification followed by damage assessment, in which the pre- and post-damage modal parameters are used within a Bayesian updating framework. An Expectation-Maximization algorithm is proposed to find the most probable values of the parameters. The results show that the probabilistic approach is able to detect and assess most damage locations involving stiffness losses of braces in the braced-frame cases, while its success in detecting rotational stiffness losses of the beam-column connections in the unbraced cases may rely on sufficient prior information about the column stiffnesses.
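
    As a hedged illustration of the damage-assessment step, the sketch below computes most probable story-stiffness factors of a toy three-story shear building from identified modal frequencies by maximizing a Gaussian likelihood plus prior; the model, noise levels, and optimizer are assumptions, and this is not the report's Expectation-Maximization formulation.

```python
# Minimal sketch (not the report's EM formulation): MAP estimation of
# story-stiffness scaling factors theta from identified modal frequencies
# of a simple 3-story shear building. Damage appears as a drop in theta.
import numpy as np
from scipy.linalg import eigh
from scipy.optimize import minimize

m = np.eye(3)                      # lumped masses (normalized, assumed)
k0 = 1.0                           # nominal story stiffness (assumed units)

def stiffness(theta):
    """Assemble the shear-building stiffness with per-story scaling theta."""
    k = k0 * np.asarray(theta)
    K = np.zeros((3, 3))
    for i, ki in enumerate(k):
        K[i, i] += ki
        if i > 0:
            K[i - 1, i - 1] += ki
            K[i - 1, i] -= ki
            K[i, i - 1] -= ki
    return K

def model_freqs(theta):
    w2 = eigh(stiffness(theta), m, eigvals_only=True)
    return np.sqrt(w2) / (2 * np.pi)

def neg_log_posterior(theta, f_meas, sigma_f, theta_prior, sigma_theta):
    # Gaussian likelihood on identified frequencies + Gaussian prior on theta
    r = (model_freqs(theta) - f_meas) / sigma_f
    p = (np.asarray(theta) - theta_prior) / sigma_theta
    return 0.5 * (r @ r + p @ p)

# "Post-damage" frequencies simulated with a 30% stiffness loss in story 1
f_meas = model_freqs([0.7, 1.0, 1.0]) * (1 + 0.005 * np.random.randn(3))
res = minimize(neg_log_posterior, x0=[1.0, 1.0, 1.0],
               args=(f_meas, 0.01 * f_meas, 1.0, 0.2),
               bounds=[(0.05, 2.0)] * 3)
print("most probable stiffness factors:", res.x.round(3))
```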

    Uncertainty Propagation and Feature Selection for Loss Estimation in Performance-based Earthquake Engineering

    This report presents a new methodology, called moment matching, for propagating the uncertainties in estimating the repair costs of a building due to future earthquake excitation, as required, for example, when assessing a design in performance-based earthquake engineering. Besides excitation uncertainties, other uncertain model variables are considered, including uncertainties in the structural model parameters and in the capacity and repair costs of structural and non-structural components. Using the first few moments of these uncertain variables, moment matching requires only a few well-chosen point estimates to propagate the uncertainties and estimate the first few moments of the repair costs with high accuracy. The use of moment matching to estimate the exceedance probability of the repair costs is also addressed. Two buildings are chosen as illustrative examples to demonstrate the use of moment matching: a hypothetical three-story shear building and a real seven-story hotel building. These examples illustrate that the moment-matching approach is quite general; for example, it can be applied to any decision variable in performance-based earthquake engineering. For both examples, the assembly-based vulnerability approach is employed when calculating repair costs. It is shown that the moment-matching technique is much more accurate than the well-known First-Order-Second-Moment approach when propagating the first two moments, while the resulting computational cost is of the same order. The repair-cost moments and exceedance probability estimated by moment matching are also compared with those obtained by Monte Carlo simulation; as long as the order of the moment matching is sufficient, the agreement is satisfactory. Furthermore, the amount of computation for moment matching scales only linearly with the number of uncertain input variables. Finally, a procedure for feature selection is presented and illustrated for the second example. The conclusion is that the most important uncertain input variables among the many influencing the uncertainty in future repair costs are, in order of importance, ground-motion spectral acceleration, component capacity, ground-motion details, and unit repair costs.
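
    The following is a minimal sketch of the point-estimate idea behind moment matching, assuming a single Gaussian input and using three Gauss-Hermite evaluation points (the report's point sets and multivariate treatment are more general); the response-to-cost function and its parameters are placeholders, and a crude Monte Carlo run is included only as a reference.

```python
# Illustrative sketch of moment propagation with a few well-chosen points
# (here: 3-point Gauss-Hermite for a single Gaussian input; the report's
# moment-matching point sets are more general). Compared with Monte Carlo.
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def repair_cost(x):
    """Toy nonlinear response-to-cost map (assumed for illustration)."""
    return 50.0 + 400.0 * np.maximum(x, 0.0) ** 1.5

mu, sigma = 0.8, 0.3                # mean/std of the uncertain input (assumed)

# Point-estimate propagation: only 3 evaluations of the model
x_pts, w = hermegauss(3)            # nodes/weights for weight exp(-x^2 / 2)
w = w / w.sum()                     # normalize so the weights sum to 1
y = repair_cost(mu + sigma * x_pts)
mean_pe = w @ y
var_pe = w @ (y - mean_pe) ** 2

# Monte Carlo reference: 100,000 evaluations of the model
samples = repair_cost(mu + sigma * np.random.randn(100_000))
print("point-estimate mean/std:", mean_pe, np.sqrt(var_pe))
print("Monte Carlo    mean/std:", samples.mean(), samples.std())
```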

    Impact of sample path smoothness on geotechnical reliability

    The scale of fluctuation (SOF) of a spatially variable soil property is widely regarded as the most important parameter characterizing the effect of spatial averaging, while the type of auto-correlation model is thought to have limited impact. This paper shows that this statement is true if the limit state function is completely governed by spatial averaging, but that the sample path smoothness can have a significant impact if it is not. Three practical examples are presented to illustrate the effect of sample path smoothness. The authors would like to thank Dr. Yu-Gang Hu and Miss Tzu-Ting Lin for their efforts in producing the results in some of the plots.
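
    A short numerical sketch of why the SOF dominates when spatial averaging governs: the standard variance-reduction factor is computed for a single-exponential and a squared-exponential auto-correlation model with the same SOF. The definitions below are the usual Vanmarcke ones, not formulas taken from the paper, and the SOF value is an assumption.

```python
# Sketch: variance-reduction factor Gamma^2(L) for local averaging over a
# length L under two auto-correlation models with the same scale of
# fluctuation (SOF). Standard definitions, not taken from the paper.
import numpy as np

def gamma2(rho, L, n=2000):
    """Gamma^2(L) = (2 / L^2) * integral_0^L (L - tau) * rho(tau) dtau."""
    tau = np.linspace(0.0, L, n)
    f = (L - tau) * rho(tau)
    dtau = tau[1] - tau[0]
    return 2.0 / L**2 * dtau * (f.sum() - 0.5 * (f[0] + f[-1]))

sof = 2.0                                            # scale of fluctuation (assumed, m)
markov = lambda t: np.exp(-2.0 * t / sof)            # single-exponential model
gauss = lambda t: np.exp(-np.pi * (t / sof) ** 2)    # squared-exponential model

for L in [1.0, 5.0, 20.0]:
    print(f"L = {L:5.1f} m  Gamma^2: Markov {gamma2(markov, L):.3f}  "
          f"Gaussian {gamma2(gauss, L):.3f}")
# For L well above the SOF the two models give nearly the same reduction,
# so the SOF, not the model type, controls the averaged variance.
```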

    Real-time Bayesian State Estimation of Uncertain Dynamical Systems

    The focus of this report is real-time Bayesian state estimation using nonlinear models. A recently developed method based on Monte Carlo simulation, the particle filter, is studied. Unlike the well-known extended Kalman filter, it is applicable to highly nonlinear systems with non-Gaussian uncertainties. Recently developed techniques that improve the convergence of the particle filter simulations are also introduced and discussed. Comparisons between the particle filter and the extended Kalman filter are made using several numerical examples of nonlinear systems. The results indicate that the particle filter provides consistent state and parameter estimates for highly nonlinear systems, while the extended Kalman filter does not. The particle filter is applied to a real-data case study: a 7-story hotel whose structural system consists of non-ductile reinforced-concrete moment frames, one of which was severely damaged during the 1994 Northridge earthquake. Two identification models are proposed: a time-varying linear model and a simplified time-varying nonlinear degradation model. The latter is derived from a nonlinear finite-element model of the building previously developed at Caltech. For the former model, the resulting performance is poor because the parameters need to vary significantly with time in order to capture the structural degradation of the building during the earthquake. The latter model performs better because it is able to characterize this degradation to a certain extent even with its parameters fixed. Once again, the particle filter provides consistent state and parameter estimates, in contrast to the extended Kalman filter. It is concluded that for a state estimation procedure to be successful, at least two factors are essential: an appropriate estimation algorithm and a suitable identification model. Finally, recorded motions from the 1994 Northridge earthquake are used to illustrate how to perform real-time performance evaluation by computing estimates of the repair costs and probability of component damage for the hotel.
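
    The following is a minimal bootstrap particle filter sketch for joint state and parameter estimation of a toy scalar nonlinear system; it illustrates the algorithm class only, and the system, noise levels, and parameter-jitter step are assumptions rather than the report's identification models.

```python
# Minimal bootstrap particle filter sketch (illustrative toy system, not the
# report's building models): joint estimation of the state x and an unknown
# stiffness-like parameter a of a scalar nonlinear system.
import numpy as np

rng = np.random.default_rng(0)
T, N = 100, 2000                       # time steps, particles

def f(x, a):                           # nonlinear state transition (assumed)
    return a * x + 0.5 * np.sin(x)

# Simulate "true" data
a_true, x, ys = 0.9, 0.0, []
for _ in range(T):
    x = f(x, a_true) + 0.1 * rng.standard_normal()
    ys.append(x + 0.2 * rng.standard_normal())      # noisy measurement

# Particles carry the state and the parameter (augmented state)
xp = rng.standard_normal(N)
ap = rng.uniform(0.5, 1.2, N)
for y in ys:
    xp = f(xp, ap) + 0.1 * rng.standard_normal(N)           # propagate
    w = np.exp(-0.5 * ((y - xp) / 0.2) ** 2)                # likelihood weights
    w /= w.sum()
    idx = rng.choice(N, N, p=w)                             # resample
    xp, ap = xp[idx], ap[idx]
    ap += 0.005 * rng.standard_normal(N)                    # jitter to avoid collapse

print("estimated parameter a:", ap.mean(), "(true value 0.9)")
```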

    Real-time Loss Estimation for Instrumented Buildings

    Motivation. A growing number of buildings have been instrumented to measure and record earthquake motions and to transmit these records to seismic-network data centers to be archived and disseminated for research purposes. At the same time, sensors are becoming smaller, less expensive to install, and capable of sensing and transmitting other environmental parameters in addition to acceleration. Finally, recently developed performance-based earthquake engineering methodologies employ structural-response information to estimate probabilistic repair costs, repair durations, and other metrics of seismic performance. The opportunity therefore presents itself to combine these developments into the capability to estimate automatically, in near-real-time, the probabilistic seismic performance of an instrumented building shortly after the cessation of strong motion. We refer to this opportunity as (near-) real-time loss estimation (RTLE).

    Methodology. This report presents a methodology for RTLE for instrumented buildings. Seismic performance is measured in terms of probabilistic repair cost, the likely locations of physical damage, operability, and life safety. The methodology uses the instrument recordings and a Bayesian state-estimation algorithm called a particle filter to estimate the probabilistic structural response of the system in terms of member forces and deformations. The structural-response estimate is then used as input to component fragility functions to estimate the probabilistic damage state of structural and nonstructural components. The probabilistic damage state can be used to direct structural engineers to likely locations of physical damage, even if they are concealed behind architectural finishes. The damage state is used with construction cost-estimation principles to estimate probabilistic repair cost. It is also used as input to a quantified, fuzzy-set version of the FEMA-356 performance-level descriptions to estimate probabilistic safety and operability levels.

    CUREE demonstration building. The procedure for estimating damage locations, repair costs, and post-earthquake safety and operability is illustrated in parallel demonstrations by CUREE and Kajima research teams. The CUREE demonstration uses a real 1960s-era, 7-story, nonductile reinforced-concrete moment-frame building located in Van Nuys, California. The building is instrumented with 16 channels at five levels: the ground level, floors 2, 3, and 6, and the roof. We used the records obtained in the 1994 Northridge earthquake to hindcast performance in that earthquake; the building is analyzed in its condition prior to the earthquake. It is found that, while hindcasting of the overall system performance level was excellent, prediction of detailed damage locations was poor, implying that either actual conditions differed substantially from those shown on the structural drawings, or inappropriate fragility functions were employed, or both. We also found that Bayesian updating of the structural model using observed structural response above the base of the building adds little information to the performance prediction, probably because structural uncertainties have only a secondary effect on performance uncertainty compared with the uncertainty in assembly damageability as quantified by the fragility functions. The implication is that real-time loss estimation is not sensitive to structural uncertainties (saving costly multiple simulations of structural response) and does not benefit significantly from installing measuring instruments other than those at the base of the building.

    Kajima demonstration building. The Kajima demonstration uses a real 1960s-era office building in Kobe, Japan. The building, a 7-story reinforced-concrete shear-wall building, was not instrumented in the 1995 Kobe earthquake, so instrument recordings are simulated. The building is analyzed in its condition prior to the earthquake. It is found that, while hindcasting of the overall repair cost was excellent, prediction of detailed damage locations was poor, again implying either that as-built conditions differed substantially from those shown on the structural drawings, or that inappropriate fragility functions were used, or both. We also found that the parameters of the detailed particle filter needed significant tuning, which would be impractical in actual application; work is needed to prescribe values of these parameters in general.

    Opportunities for implementation and further research. Because much of the cost of applying this RTLE algorithm results from the cost of instrumentation and the effort of setting up a structural model, the readiest application would be to important, already-instrumented facilities whose structural models are already available. It would be useful to study under what conditions RTLE would be economically justified. Two other interesting possibilities for further study are (1) updating the performance estimate using readily observable damage, and (2) quantifying the value of information for expensive inspections, e.g., if one inspects a connection with a modeled 50% failure probability and finds that the connection is undamaged, is it necessary to examine one with a 10% failure probability?
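
    As a hedged sketch of the damage and loss stage of such a pipeline, the code below maps an estimated demand to damage-state probabilities through lognormal component fragility functions and weights placeholder unit repair costs; all medians, dispersions, and costs are assumed values, not those used in the report.

```python
# Sketch of the damage/loss stage of a real-time loss pipeline: lognormal
# fragility functions map an estimated demand (e.g. peak interstory drift)
# to damage-state probabilities, which weight unit repair costs.
# Fragility medians/dispersions and costs below are placeholders.
import numpy as np
from scipy.stats import norm

def ds_probabilities(edp, medians, beta):
    """P(DS >= i | edp) for each damage state, then per-state probabilities."""
    p_exceed = norm.cdf(np.log(edp / np.asarray(medians)) / beta)
    p_exceed = np.concatenate(([1.0], p_exceed, [0.0]))
    return p_exceed[:-1] - p_exceed[1:]        # P(DS = i), i = 0..n

drift = 0.012                                  # estimated demand (assumed)
medians = [0.005, 0.015, 0.030]                # DS1..DS3 drift medians (assumed)
costs = [0.0, 2_000.0, 15_000.0, 60_000.0]     # repair cost per DS (assumed, $)

p = ds_probabilities(drift, medians, beta=0.4)
expected_cost = float(np.dot(p, costs))
print("damage-state probabilities:", p.round(3))
print("expected repair cost for this component: $%.0f" % expected_cost)
```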

    Microseismic source deconvolution: Wiener filter versus minimax, Fourier versus wavelets, and linear versus nonlinear

    Deconvolution is commonly performed on microseismic signals to determine the time history of a dislocation source, usually modeled as a combination of forces or couples. This paper presents a new deconvolution method that uses a nonlinear thresholding estimator, which is based on the minimax framework and operates in the wavelet domain. Experiments were performed on a steel plate using artificially generated microseismic signals, which were recorded by high-fidelity displacement sensors at various locations. The source functions were deconvolved from the recorded signals by Wiener filters and by the new method. A comparison of the results shows that the new method outperforms the Wiener filters in reducing noise while preserving the sharp features of the source functions. Other advantages of the nonlinear thresholding estimator are that (1) its performance is close to that of a minimax estimator, (2) it is nonlinear and takes advantage of sparse representations under wavelet bases, and (3) its computation is faster than the fast Fourier transform. (C) 2004 Acoustical Society of America.
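
    Below is a small sketch of wavelet-domain soft thresholding, the kind of nonlinear estimator discussed above, applied to a noisy pulse. It uses PyWavelets and a universal threshold as assumptions and illustrates the estimator class, not the paper's minimax procedure or its deconvolution step.

```python
# Sketch of wavelet-domain soft thresholding (the denoising idea applied
# after deconvolution); PyWavelets and the universal threshold are assumed
# choices, not the paper's minimax procedure.
import numpy as np
import pywt

rng = np.random.default_rng(1)
n = 1024
t = np.linspace(0, 1, n)
source = np.where((t > 0.3) & (t < 0.35), 1.0, 0.0)        # sharp pulse
noisy = source + 0.1 * rng.standard_normal(n)              # noisy estimate of it

coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745             # noise level from finest scale
lam = sigma * np.sqrt(2 * np.log(n))                       # universal threshold
denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, lam, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(denoised_coeffs, "db4")[:n]

print("rms error, noisy:   ", np.sqrt(np.mean((noisy - source) ** 2)))
print("rms error, denoised:", np.sqrt(np.mean((denoised - source) ** 2)))
```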

    Reliability-based design for allowable bearing capacity of footings on rock masses by considering angle of distortion

    This study addresses the impact of spatial variability on the angle of distortion between two footings on rock masses. A simple elastic-perfectly-plastic model based on the Hoek–Brown criterion is adopted to simulate the spatial variation of rock mass properties in the finite element analyses. The model is calibrated against a large rock mass database. With Monte Carlo simulations, stochastic samples of the angle of distortion between two footings are obtained, which are further used to derive reliability-based allowable bearing stresses. The analysis results show that the geological strength index (GSI) of the rock mass and the uniaxial compressive strength of the intact rock are the two dominant factors affecting the reliability-based design. Comparisons with existing codes show that these codes are appropriate for poor to fair rock masses, conservative for good to very good rock masses, and unconservative for very poor rock masses.
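
    As a hedged sketch of the Monte Carlo ingredient, the code below samples the two dominant inputs identified above (GSI and intact-rock uniaxial compressive strength) and propagates them through the 2002 Hoek-Brown parameter relations to a rock mass strength; the distributions and the intact-rock constant are assumed, and the paper's finite-element distortion analysis is not reproduced.

```python
# Sketch of the Monte Carlo ingredient: sample GSI and intact-rock UCS and
# compute 2002 Hoek-Brown rock-mass parameters (disturbance D = 0) to see
# how variability in the two dominant inputs propagates to rock-mass
# strength. Not the paper's finite-element distortion analysis.
import numpy as np

rng = np.random.default_rng(2)
n = 50_000
mi = 10.0                                              # intact-rock constant (assumed)
gsi = np.clip(rng.normal(55.0, 8.0, n), 10, 95)        # geological strength index (assumed)
sigma_ci = rng.lognormal(np.log(60.0), 0.3, n)         # intact UCS, MPa (assumed)

mb = mi * np.exp((gsi - 100.0) / 28.0)
s = np.exp((gsi - 100.0) / 9.0)
a = 0.5 + (np.exp(-gsi / 15.0) - np.exp(-20.0 / 3.0)) / 6.0
sigma_cm = sigma_ci * s ** a                           # rock-mass uniaxial strength

print("rock-mass UCS, MPa: mean %.2f, 5%% fractile %.2f"
      % (sigma_cm.mean(), np.percentile(sigma_cm, 5)))
```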

    Estimation of rock pressure during an excavation/cut in sedimentary rocks with inclined bedding planes

    The estimation of rock pressure induced by an excavation/cut in sedimentary rocks is addressed in this study. A simplified stochastic model is proposed to represent this rock pressure, accounting for sliding along parallel bedding planes as well as random friction angles on these planes. Simulations show that the classical Rankine and Coulomb theories typically give active pressures much larger than those predicted by the proposed model. A simplified reliability-based design approach is developed to calibrate the partial factors required for determining the design rock pressure. The proposed approach is demonstrated with a case study in northern Taiwan, and design charts are developed to facilitate the determination of design rock pressures induced by excavations/cuts in sedimentary rocks.
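
    The sketch below is a strongly simplified illustration of the comparison mentioned above: the active thrust from a single wedge sliding on one bedding plane with a random friction angle versus the Rankine active thrust. The geometry, unit weight, and friction-angle distribution are assumptions, the result depends strongly on the assumed bedding dip, and the paper's stochastic multi-plane model is not reproduced.

```python
# Illustrative sketch (a simplification, not the paper's stochastic
# multi-bedding-plane model): active thrust on a cut of height H from a
# single wedge sliding on a bedding plane dipping at beta, with a random
# friction angle phi on that plane, compared with the Rankine active thrust.
import numpy as np

rng = np.random.default_rng(3)
gamma, H, beta = 24.0, 10.0, np.radians(80.0)      # kN/m^3, m, bedding dip (assumed)
phi = np.radians(rng.normal(32.0, 4.0, 50_000))    # bedding friction angle (assumed)

# Wedge sliding on the fixed bedding plane (zero thrust if phi >= beta)
W = 0.5 * gamma * H**2 / np.tan(beta)              # wedge weight per unit length
P_wedge = np.maximum(W * np.tan(beta - phi), 0.0)

# Rankine active thrust with the same friction angle
Ka = np.tan(np.pi / 4 - phi / 2) ** 2
P_rankine = 0.5 * Ka * gamma * H**2

print("mean thrust, kN/m: wedge %.1f vs Rankine %.1f"
      % (P_wedge.mean(), P_rankine.mean()))
```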